Combining Bagging and Boosting
Authors
Abstract
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data; however, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we build an ensemble that combines, through a voting methodology, a bagging ensemble and a boosting ensemble with 10 sub-classifiers each. We compared it with simple bagging and boosting ensembles of 25 sub-classifiers, as well as with other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
Keywords: data mining, machine learning, pattern recognition.
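The scheme described above maps naturally onto off-the-shelf components. Here is a minimal sketch, assuming scikit-learn, decision trees as the common base classifier, and soft voting; the abstract prescribes none of these specifics, and the dataset is purely illustrative.

```python
# Sketch of the proposed combination: a voting ensemble over one bagging
# ensemble and one boosting ensemble, 10 sub-classifiers each.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Both member ensembles are built on the same base learning algorithm
# (decision trees here; the choice of base learner is an assumption).
bagging = BaggingClassifier(estimator=DecisionTreeClassifier(),
                            n_estimators=10, random_state=0)
boosting = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),
                              n_estimators=10, random_state=0)

# The two ensembles are themselves combined by (soft) voting.
combined = VotingClassifier(estimators=[("bagging", bagging),
                                        ("boosting", boosting)],
                            voting="soft")

print(cross_val_score(combined, X, y, cv=10).mean())
```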
Similar Articles
The Role of Combining Rules in Bagging and Boosting
To improve weak classifiers, bagging and boosting can be used. These techniques are based on combining classifiers. Usually, a simple majority vote or a weighted majority vote is used as the combining rule in bagging and boosting. However, other combining rules such as mean, product and average are possible. In this paper, we study bagging and boosting in Linear Discriminant Analysis (LDA) and t...
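The combining rules named in this snippet are easy to state concretely. Below is a small illustrative sketch with NumPy; the probability values are made up for the example, and the paper's own experiments (which use LDA classifiers) are not reproduced here.

```python
# Three combining rules applied to per-classifier class-probability estimates.
import numpy as np

# probs[i, k] = probability that member classifier i assigns to class k.
probs = np.array([[0.60, 0.40],
                  [0.60, 0.40],
                  [0.01, 0.99]])

mean_rule = probs.mean(axis=0).argmax()      # average of posteriors
product_rule = probs.prod(axis=0).argmax()   # product of posteriors
majority_vote = np.bincount(probs.argmax(axis=1)).argmax()  # simple majority

print(mean_rule, product_rule, majority_vote)  # -> 1 1 0: the rules need not agree
```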
Combining Bagging and Additive Regression
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of regression models using the same learning algorithm as the base learner. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in t...
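A hedged sketch of this regression analogue, assuming scikit-learn and using GradientBoostingRegressor to stand in for additive regression; the snippet names neither a library nor a dataset, so both are assumptions here.

```python
# A bagging ensemble and an additive-regression (gradient boosting) ensemble,
# combined by averaging their predictions via VotingRegressor.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import (BaggingRegressor, GradientBoostingRegressor,
                              VotingRegressor)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = load_diabetes(return_X_y=True)

bagging = BaggingRegressor(estimator=DecisionTreeRegressor(),
                           n_estimators=10, random_state=0)
additive = GradientBoostingRegressor(n_estimators=10, random_state=0)

combined = VotingRegressor(estimators=[("bagging", bagging),
                                       ("additive", additive)])
print(cross_val_score(combined, X, y, cv=10, scoring="r2").mean())
```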
Combining Bagging, Boosting and Random Subspace Ensembles for Regression Problems
Bagging, boosting and random subspace methods are well-known resampling ensemble methods that generate and combine a diversity of learners using the same learning algorithm for the base regressor. In this work, we built an ensemble of bagging, boosting and random subspace ensembles with 8 sub-regressors in each one, and an averaging methodology is then used for the final prediction. We ...
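One way to sketch this three-ensemble average in scikit-learn, under the assumption that BaggingRegressor with bootstrap=False and max_features below 1.0 can emulate the random subspace method; the dataset and hyperparameters are illustrative, not the paper's.

```python
# Average the predictions of bagging, boosting and random-subspace ensembles,
# each built from 8 sub-regressors on the same base learner.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

members = [
    BaggingRegressor(estimator=DecisionTreeRegressor(),
                     n_estimators=8, random_state=0),
    AdaBoostRegressor(estimator=DecisionTreeRegressor(),
                      n_estimators=8, random_state=0),
    # Random subspace: every sub-regressor sees all samples but only a random
    # half of the features.
    BaggingRegressor(estimator=DecisionTreeRegressor(), n_estimators=8,
                     bootstrap=False, max_features=0.5, random_state=0),
]

for m in members:
    m.fit(X_train, y_train)

# The final prediction averages the three ensembles' outputs.
y_pred = np.mean([m.predict(X_test) for m in members], axis=0)
print(r2_score(y_test, y_pred))
```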
Combining Bias and Variance Reduction Techniques for Regression Trees
Gradient Boosting and bagging applied to regressors can reduce the error due to bias and variance, respectively. Alternatively, Stochastic Gradient Boosting (SGB) and Iterated Bagging (IB) attempt to simultaneously reduce the contribution of both bias and variance to error. We provide an extensive empirical analysis of these methods, along with two alternate bias-variance reduction approaches — ...
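Of these, Stochastic Gradient Boosting has a direct knob in scikit-learn: a subsample rate below 1.0 makes each boosting stage fit only a random fraction of the training set. A minimal sketch follows; Iterated Bagging has no stock scikit-learn implementation, so it is omitted, and the dataset is illustrative.

```python
# Stochastic Gradient Boosting: gradient boosting where each stage is fit on
# a random 50% subsample of the training data.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)

sgb = GradientBoostingRegressor(n_estimators=100, subsample=0.5, random_state=0)
print(cross_val_score(sgb, X, y, cv=10, scoring="r2").mean())
```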
Multiple Boosting: a Combination of Boosting and Bagging
Classifier committee learning approaches have demonstrated great success in increasing the prediction accuracy of classifier learning, which is a key technique for data mining. These approaches generate several classifiers to form a committee by repeated application of a single base learning algorithm. The committee members vote to decide the final classification. It has been shown that Boosting and ...
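The boosting-plus-bagging committee idea can be sketched by nesting the two ensemble types: a bagging ensemble whose members are themselves small AdaBoost committees. This sketch is in the spirit of the combination described, not the paper's exact algorithm, and all parameters below are assumptions.

```python
# Bagging over boosted committees: 5 bootstrap replicates, each carrying
# its own 10-round AdaBoost committee, combined by voting.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

booster = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),
                             n_estimators=10, random_state=0)
committee = BaggingClassifier(estimator=booster, n_estimators=5, random_state=0)

print(cross_val_score(committee, X, y, cv=10).mean())
```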